Optimal oracle inequalities for model selection
Authors
Abstract
Model selection is often performed by empirical risk minimization. The quality of a selection procedure in a given situation can be assessed by risk bounds, which require assumptions on both the margin and the tails of the losses used. Starting with examples from the three basic estimation problems, regression, classification, and density estimation, we formulate risk bounds for empirical risk minimization under successively weaker conditions and prove them at a very general level, for general margin and power-tail behavior of the excess losses.
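As a toy illustration (not from the paper), model selection by empirical risk minimization over a finite family of candidates can be sketched as follows: fit each candidate on a training split and select the one with the smallest empirical risk on a held-out split. The polynomial-regression setup and all variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = sin(x) + noise (illustrative, not from the paper)
n = 200
x = rng.uniform(-3, 3, n)
y = np.sin(x) + 0.3 * rng.normal(size=n)

# Candidate models: polynomial fits of increasing degree
degrees = range(1, 9)

def empirical_risk(deg, x_tr, y_tr, x_val, y_val):
    """Fit a degree-`deg` polynomial on the training split and return
    its empirical (squared-loss) risk on the validation split."""
    coeffs = np.polyfit(x_tr, y_tr, deg)
    preds = np.polyval(coeffs, x_val)
    return np.mean((preds - y_val) ** 2)

# One train/validation split, for illustration only
half = n // 2
risks = {d: empirical_risk(d, x[:half], y[:half], x[half:], y[half:])
         for d in degrees}
best = min(risks, key=risks.get)
print("selected degree:", best)
```

The risk bounds discussed in the abstract quantify how far the empirically selected model's risk can be from the best risk in the candidate family.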
Similar Resources
To “General Non-Exact Oracle Inequalities for Classes with a Subexponential Envelope”
We apply Theorem A to the problem of convex aggregation and show that the optimal rate of convex aggregation for non-exact oracle inequalities is much faster than the optimal rate for exact oracle inequalities. We apply Theorem B to show that regularized procedures based on a nuclear-norm criterion satisfy oracle inequalities with a residual term that decreases like 1/n for every Lq-loss functi...
Oracle inequalities for computationally adaptive model selection
We analyze general model selection procedures using penalized empirical loss minimization under computational constraints. While classical model selection approaches do not consider computational aspects of performing model selection, we argue that any practical model selection procedure must not only trade off estimation and approximation error, but also the computational effort required to co...
General Oracle Inequalities for Gibbs Posterior with Application to Ranking
In this paper, we summarize some recent results in Li et al. (2012), which can be used to extend an important PAC-Bayesian approach, namely the Gibbs posterior, to study the nonadditive ranking risk. The methodology is based on assumption-free risk bounds and nonasymptotic oracle inequalities, which lead to nearly optimal convergence rates and optimal model selection to balance the approximati...
Optimal kernel selection for density estimation
We provide new general kernel selection rules based on penalized least-squares criteria. We derive optimal oracle inequalities using adequate concentration tools. We also investigate the problem of minimal penalty as described in [BM07].
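For intuition on least-squares criteria for kernel selection, here is a sketch of the classical least-squares cross-validation (LSCV) bandwidth selector for a Gaussian kernel density estimate — a close relative of, but not the same as, the penalized criteria this abstract refers to. The data and grid are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=300)  # toy one-dimensional sample (assumption)

def gauss(u, h):
    """Gaussian kernel with bandwidth h, evaluated at u."""
    return np.exp(-0.5 * (u / h) ** 2) / (h * np.sqrt(2 * np.pi))

def lscv(h, x):
    """Least-squares cross-validation score for bandwidth h:
    integral of f_hat^2 minus twice the leave-one-out average density."""
    n = len(x)
    d = x[:, None] - x[None, :]
    # For a Gaussian kernel, the integral of f_hat^2 has a closed form
    # via the convolution of two Gaussians (bandwidth h*sqrt(2)).
    term1 = gauss(d, h * np.sqrt(2)).sum() / n**2
    # Leave-one-out term: average of f_hat_{-i}(x_i), excluding the diagonal
    k = gauss(d, h)
    loo = (k.sum() - np.trace(k)) / (n * (n - 1))
    return term1 - 2 * loo

grid = np.linspace(0.05, 1.0, 40)
scores = [lscv(h, data) for h in grid]
h_star = grid[int(np.argmin(scores))]
print("selected bandwidth:", round(float(h_star), 3))
```

Minimizing this score over a bandwidth grid approximately minimizes the integrated squared error of the resulting density estimate, up to a constant not depending on h.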
Sparse oracle inequalities for variable selection via regularized quantization
We give oracle inequalities for procedures that combine quantization and variable selection via a weighted Lasso k-means-type algorithm. The results are derived for a general family of weights, which can be tuned to size the influence of the variables in different ways. Moreover, these theoretical guarantees are proved to adapt to the corresponding sparsity of the optimal codebooks, suggesting th...
Nonparametric statistical inverse problems
We explain some basic theoretical issues regarding nonparametric statistics applied to inverse problems. Simple examples are used to present classical concepts such as the white noise model, risk estimation, minimax risk, model selection and optimal rates of convergence, as well as more recent concepts such as adaptive estimation, oracle inequalities, modern model selection methods, Stein’s unb...